Learning Noisy Linear Threshold Functions

Author

  • Tom Bylander
Abstract

This paper describes and analyzes algorithms for learning linear threshold functions (LTFs) in the presence of classification noise and monotonic noise. Under classification noise, each randomly drawn example is mislabeled (i.e., its label differs from the target LTF) with the same fixed probability. Under monotonic noise, the probability of mislabeling an example monotonically decreases with the separation between the target LTF hyperplane and the example. Monotonic noise generalizes classification noise as well as the cases of independent binary features (aka naive Bayes) and normal distributions with equal covariance matrices. Monotonic noise provides a more realistic model of noise because it allows confidence to increase as a function of the distance from the threshold, yet it does not impose any artificial form on that function. This paper shows that LTFs are polynomially PAC-learnable in the presence of classification noise and monotonic noise if the separation between examples and the target LTF hyperplane is sufficiently large, and if the vector length of each example is sufficiently small.
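
To make the two noise models concrete, they can be stated as follows. This is a sketch in our own notation, not the paper's: w is the target weight vector, x a drawn example, l(x) its observed label, and the standard bound of 1/2 on the noise rate is our assumption.

    % Classification noise: one fixed flip rate for every example.
    \Pr\left[\,\ell(x) \neq \operatorname{sign}(w \cdot x)\,\right] = \eta,
        \qquad 0 \le \eta < \tfrac{1}{2}

    % Monotonic noise: the flip rate is a non-increasing function of the
    % example's separation |w . x| from the target hyperplane.
    \Pr\left[\,\ell(x) \neq \operatorname{sign}(w \cdot x)\,\right]
        = \eta\left(\lvert w \cdot x \rvert\right),
        \qquad \eta \text{ non-increasing},\ \eta(s) < \tfrac{1}{2}

Under this reading, classification noise is the special case where the function is constant, and the naive Bayes and equal-covariance Gaussian cases mentioned above induce flip rates of exactly this monotone form.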

Similar Resources

Perceptron, Winnow, and PAC Learning

We analyze the performance of the widely studied Perceptron and Winnow algorithms for learning linear threshold functions under Valiant’s probably approximately correct (PAC) model of concept learning. We show that under the uniform distribution on boolean examples, the Perceptron algorithm can efficiently PAC learn nested functions (a class of linear threshold functions known to be hard for Per...
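
For reference, the mistake-driven update rule these results analyze looks like the sketch below (a minimal Python illustration under our own synthetic setup, not the cited paper's construction; the uniform boolean distribution is simulated with ±1 inputs):

    import numpy as np

    def perceptron(X, y, epochs=100):
        # Classic Perceptron: labels y must be in {-1, +1}; on each
        # mistake, add y_i * x_i to the weight vector.
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * np.dot(w, xi) <= 0:  # mistake (or on the boundary)
                    w += yi * xi
        return w

    # Uniform distribution on boolean examples, as in the cited setting.
    rng = np.random.default_rng(0)
    X = rng.choice([-1.0, 1.0], size=(500, 20))
    v = np.arange(1.0, 21.0)                # a hypothetical target LTF
    y = np.where(X @ v > 0, 1.0, -1.0)
    w_hat = perceptron(X, y)

Winnow keeps the same mistake-driven loop but replaces the additive step with multiplicative weight updates, which is the source of its attribute efficiency on sparse targets.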

Simplicity Bias in the Estimation of Causal Functions

We ask observers to make judgments of the best causal functions underlying noisy test data. This method allows us to examine how people combine existing biases about causal relations with new information (the noisy data). Participants are shown n data points representing a sample of noisy data from a supposed experiment. They generate points on what they believe to be the true causal function. ...

Computationally efficient probabilistic inference with noisy threshold models based on a CP tensor decomposition

Conditional probability tables (CPTs) of threshold functions represent a generalization of two popular models – noisy-or and noisy-and. They constitute an alternative to these two models in case they are too rough. When using the standard inference techniques the inference complexity is exponential with respect to the number of parents of a variable. In case the CPTs take a special form (in thi...
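
As background on the model itself (not on the CP tensor decomposition the paper uses for inference), a single noisy-threshold CPT entry can be evaluated directly. The sketch below uses hypothetical per-parent parameters; noisy-or falls out at k = 1 and noisy-and at k = n:

    def noisy_threshold(p_active, k):
        # P(child = 1) when each active parent independently "fires" with
        # its own probability and the child turns on iff at least k parent
        # effects fire: a Poisson-binomial tail, computed by dynamic
        # programming over the number of fired effects.
        dist = [1.0]  # dist[j] = P(exactly j effects fired so far)
        for p in p_active:
            new = [0.0] * (len(dist) + 1)
            for j, q in enumerate(dist):
                new[j] += q * (1 - p)
                new[j + 1] += q * p
            dist = new
        return sum(dist[k:])

    probs = [0.9, 0.7, 0.8]            # hypothetical parent parameters
    print(noisy_threshold(probs, 1))   # noisy-or:  1 - 0.1*0.3*0.2 = 0.994
    print(noisy_threshold(probs, 3))   # noisy-and: 0.9*0.7*0.8 = 0.504

Each entry is cheap to evaluate this way, but a full CPT still has one row per joint parent state, which is the exponential blow-up in the number of parents that the decomposition is meant to avoid.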

Probabilistic inference with noisy-threshold models based on a CP tensor decomposition

The specification of conditional probability tables (CPTs) is a difficult task in the construction of probabilistic graphical models. Several types of canonical models have been proposed to ease that difficulty. Noisy-threshold models generalize the two most popular canonical models: the noisy-or and the noisy-and. When using the standard inference techniques the inference complexity is exponen...

Predicting carcinoid heart disease with the noisy-threshold classifier

OBJECTIVE To predict the development of carcinoid heart disease (CHD), which is a life-threatening complication of certain neuroendocrine tumors. To this end, a novel type of Bayesian classifier, known as the noisy-threshold classifier, is applied. MATERIALS AND METHODS Fifty-four cases of patients who suffered from a low-grade midgut carcinoid tumor, of which 22 patients developed CHD, were...

Publication year: 1998